
Robotics in action in insurance


Youtube Link

Translation from English to Bosnian

EY Global

Published on 29 Jun 2017

Chris Lamberton, EMEIA Robotic Process Automation Leader for EY, walks through some examples of how robotics could influence customer interactions in the insurance sector.

English Subtitles

00:00 Robotic Process Automation or software
00:02 robotics is technology that basically
00:06 emulates humans doing manual processes
00:09 across any number of different systems in your organization.
00:11 What we're going to
00:13 show you today is a robot interacting
00:17 with a digital experience for the customer.
00:19 In this case we have a mobile
00:22 app and the mobile app is going to allow
00:24 the customer to raise a claim on their  car insurance.
00:27 That means the customer
00:29 doesn't have to call in; it means that
00:31 agents don't have to do lots of typing
00:33 on existing systems.
00:34 It basically streamlines the entire
00:36 claim process and this demo will also
00:40 show how the robot can keep the client
00:42 informed of what's going on with their
00:44 claim and actually help them in
00:47 interacting with suppliers,
00:51 for example, to actually sort out repairs.
00:54 So what you're going to see is the robot popping
00:56 up a claim form. What you see is it
00:58 is set up to just check from the policy
01:02 which vehicle was actually involved.
01:06 So there's only two choices so that's been
01:09 set up by the robot to only have two
01:12 valid choices depending upon the
01:13 information in the policy.
01:16 Similarly when we actually look at which people were
01:19 involved with the incident only valid
01:21 drivers again pre-populated by the robot
01:24 are displayed on the claim form.
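To make that pre-population step concrete, here is a minimal Python sketch of how valid form choices might be derived from a policy record. This is an illustration only, not EY's implementation; the Policy structure and the field names are assumptions.

from dataclasses import dataclass, field

@dataclass
class Policy:
    vehicles: list = field(default_factory=list)        # vehicles covered by the policy (assumed field)
    named_drivers: list = field(default_factory=list)    # drivers named on the policy (assumed field)

def claim_form_options(policy: Policy) -> dict:
    """Return only the choices that are valid for this policy."""
    return {
        "vehicle_choices": list(policy.vehicles),
        "driver_choices": list(policy.named_drivers),
    }

if __name__ == "__main__":
    policy = Policy(vehicles=["Ford Focus", "VW Golf"],
                    named_drivers=["Alex Smith", "Sam Smith"])
    # The app would render only these options on the claim form.
    print(claim_form_options(policy))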
01:29 Because we have the claim form on an
01:33 iPad we can use the full functionality
01:35 of the tablet, so in this case we can
01:38 show, you know, are you actually close to
01:40 the incident, where did the incident
01:42 happen, using maps and GPS, and we also
01:46 can show graphical images of what part
01:50 of the car was involved in the incident,
01:52 and by clicking on the various parts of
01:54 the car that were involved it enters the
01:57 information onto the claim form directly.
01:59 We can then obviously put in information
02:02 about the actual incident itself,
02:06 so reversing out of the driveway we hit a
02:07 gate post etc. etc.
02:13 And then we can actually also upload
02:17 images, so take pictures of the damage
02:20 directly from the camera, use the camera
02:23 in real time, or from previously stored
02:26 information. You'll see that each one of
02:29 the fields is validated: so when did it happen,
02:33 what was the weather, what
02:36 type of trip was it; all those things are pre-done.
02:39 Now what you're seeing is that
02:42 information has been sent to the robot
02:44 and the robot is rapidly entering that
02:47 into an existing claim system.
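The data-entry step described here can be pictured as the robot replaying, against the claim system, the same keystrokes an agent would have made. The sketch below is a hypothetical stand-in in Python; LegacyClaimSystem and its methods are assumptions, not the real interfaces shown in the demo.

class LegacyClaimSystem:
    """Hypothetical adapter over whatever screen- or API-level automation the robot uses."""

    def open_new_claim(self, policy_number: str) -> str:
        print(f"Opening new claim for policy {policy_number}")
        return "CLM-0001"  # claim reference returned by the legacy system

    def fill_field(self, claim_ref: str, name: str, value) -> None:
        print(f"[{claim_ref}] {name} = {value}")

    def submit(self, claim_ref: str) -> None:
        print(f"[{claim_ref}] submitted")

def register_claim(system: LegacyClaimSystem, claim: dict) -> str:
    """Replay the manual data-entry steps a human agent would otherwise do."""
    ref = system.open_new_claim(claim["policy_number"])
    for field_name in ("vehicle", "driver", "location", "damaged_parts", "description"):
        system.fill_field(ref, field_name, claim[field_name])
    system.submit(ref)
    return ref

if __name__ == "__main__":
    claim = {
        "policy_number": "POL-123",
        "vehicle": "Ford Focus",
        "driver": "Alex Smith",
        "location": "51.50, -0.12",             # from the app's GPS
        "damaged_parts": ["rear bumper"],        # from the tap-on-car diagram
        "description": "Reversed into a gate post leaving the driveway",
    }
    print("Claim reference:", register_claim(LegacyClaimSystem(), claim))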
02:51 So on the bottom
02:52 right you're seeing the steps the robot
02:54 is going through, and on the left hand side,
02:56 the system it's interacting with.
02:58 The insurance company then has a robot
03:02 interacting with any number of legacy
03:04 systems in order to process that
03:07 claim, so effectively we can digitally
03:10 enable existing systems and effectively
03:13 have a win-win for both the customer and
03:15 the insurance company.
03:17 And what you'll now see is the information being popped
03:23 up onto the mobile app to say that that
03:26 claim has been registered.
03:29 So you see a claim tile has popped up on
03:33 the mobile app and when we look at that
03:36 claim, you'll actually see that the time
03:40 to resolve the claim has been put there
03:42 by the robot saying it's four days until
03:46 that claim should be paid.
03:49 That means that the client knows there is a certain
03:52 amount of time to wait and won't keep
03:54 calling in, and so effectively the robot
03:57 can keep telling the claimant what's
04:00 happening with their claim.
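As a rough sketch of that status update, assuming a hypothetical push-notification hook: the robot works out the expected settlement date and posts it back to the claim tile in the app. The function names and the four-day figure used here simply mirror the demo's example.

from datetime import date, timedelta

def build_status_update(claim_ref: str, days_to_settle: int = 4) -> dict:
    """Compose the message the robot pushes to the claim tile in the app."""
    due = date.today() + timedelta(days=days_to_settle)
    return {
        "claim_ref": claim_ref,
        "status": "registered",
        "message": f"Your claim should be paid within {days_to_settle} days "
                   f"(by {due.isoformat()}).",
    }

def push_to_app(update: dict) -> None:
    # Stand-in for the real push-notification service.
    print("PUSH:", update["message"])

if __name__ == "__main__":
    push_to_app(build_status_update("CLM-0001"))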
04:03 What you'll also then see is the robot acting at the
04:06 backend of the process.
04:08 In this case it's going to interact with the mechanic's
04:11 website to try and schedule a mechanic
04:13 to come and pick up the car.
04:15 So you see that the robot is interacting and
04:17 entering information about the claim
04:18 into the mechanic's website, and then once
04:22 the mechanic has responded it's going to
04:24 push a notification to the mobile app
04:27 to say your car will be
04:29 collected at a certain time by Graham
04:32 from Graham's garage.
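The back-end scheduling step could look something like the following sketch, where MechanicSite and notify_customer are hypothetical placeholders for the garage's booking site and the app's notification channel; the demo does not name the real interfaces.

class MechanicSite:
    """Hypothetical wrapper around the garage's web booking form."""

    def request_pickup(self, claim_ref: str, vehicle: str) -> dict:
        print(f"Booking pickup of {vehicle} for claim {claim_ref}")
        # The real robot would fill in the garage's web form here.
        return {"garage": "Graham's Garage", "driver": "Graham",
                "pickup_time": "Tuesday 09:00"}

def notify_customer(claim_ref: str, booking: dict) -> None:
    # Stand-in for the push notification sent to the mobile app.
    print(f"[{claim_ref}] Your car will be collected at {booking['pickup_time']} "
          f"by {booking['driver']} from {booking['garage']}.")

def schedule_repair(claim_ref: str, vehicle: str) -> None:
    booking = MechanicSite().request_pickup(claim_ref, vehicle)
    notify_customer(claim_ref, booking)

if __name__ == "__main__":
    schedule_repair("CLM-0001", "Ford Focus")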
04:33 Effectively it's an end-to-end process that the robot has
04:37 done across that entire claim.
04:40 The important point to note about this whole
04:42 demo is that a human was not involved
04:44 other than the claimant in the
04:46 processing of that claim.
04:48 All the entry up front and then all the subsequent
04:52 interaction with the customer has
04:55 been done by the robot, and so this will
04:57 massively reduce the cost of handling that claim.
05:00 So what we've shown is how a
05:03 digital experience and a robot can
05:06 interact to create a fantastic
05:08 experience for the customer, so that they
05:10 can enter their information in less than
05:13 a minute without having to call in; they
05:15 can do it 24 hours a day, and what it
05:17 allows organizations to do is get the
05:21 robot to focus on the kind of low-value
05:24 data entry, the no-value
05:27 processes, and allow humans to focus on far
05:31 more value-added work.

EY Building a better working world.
